# 235GB Pretraining
## Jasmine 350M
JASMINE is a series of Arabic GPT models designed for few-shot learning, with parameters ranging from 300 million to 6.7 billion, pretrained on 235GB of text data.
Tags: Large Language Model, Transformers
Publisher: UBC-NLP
© 2025 AIbase